Communications Inspired Linear Discriminant Analysis
Authors
Abstract
We study the problem of supervised linear dimensionality reduction, taking an information-theoretic viewpoint. The linear projection matrix is designed by maximizing the mutual information between the projected signal and the class label. By harnessing a recent theoretical result on the gradient of mutual information, the above optimization problem can be solved directly using gradient descent, without requiring simplification of the objective function. Theoretical analysis and empirical comparison are made between the proposed method and two closely related methods, and comparisons are also made with a method in which Rényi entropy is used to define the mutual information (in this case the gradient may be computed simply, under a special parameter setting). Relative to these alternative approaches, the proposed method achieves promising results on real datasets.
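The paper's analytic expression for the gradient of mutual information is not reproduced here, but the overall procedure can be illustrated with a minimal sketch: gradient-ascend an MI estimate between a one-dimensional projection and the class label. This sketch substitutes a Gaussian approximation of I(w^T x; y) and central-difference gradients for the paper's exact gradient; all function names and settings are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def mi_gaussian(z, y):
    """Gaussian approximation of I(Z; Y) for a 1-D projection z and labels y:
    I = H(Z) - H(Z|Y), with each entropy taken as 0.5*log(variance) + const."""
    total = 0.5 * np.log(z.var())
    cond = sum((y == c).mean() * 0.5 * np.log(z[y == c].var())
               for c in np.unique(y))
    return total - cond

def fit_direction(X, y, steps=200, lr=0.3, eps=1e-4, seed=0):
    """Ascend the MI estimate with numerical gradients; the paper instead
    uses a theoretical result giving the MI gradient in closed form."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(steps):
        g = np.zeros_like(w)
        for j in range(w.size):
            e = np.zeros_like(w)
            e[j] = eps
            g[j] = (mi_gaussian(X @ (w + e), y)
                    - mi_gaussian(X @ (w - e), y)) / (2 * eps)
        w = w + lr * g
        w /= np.linalg.norm(w)  # constrain the projection to the unit sphere
    return w

# Two Gaussian classes separated along the first coordinate only.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0.0, 0.0], 1.0, (200, 2)),
               rng.normal([3.0, 0.0], 1.0, (200, 2))])
y = np.repeat([0, 1], 200)
w = fit_direction(X, y)  # should align with the discriminative first axis
```

Because the classes differ only along the first coordinate, the learned direction should concentrate its weight there, giving a higher MI estimate than the uninformative second axis.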
Related works
Subclass discriminant Nonnegative Matrix Factorization for facial image analysis
Nonnegative Matrix Factorization (NMF) is among the most popular subspace methods, widely used in a variety of image processing problems. Recently, a discriminant NMF method incorporating Linear Discriminant Analysis-inspired criteria has been proposed; it achieves an efficient decomposition of the data into its discriminant parts, thus enhancing classification performance. Howeve...
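The discriminant NMF variant is not spelled out in the excerpt, but the plain NMF decomposition it builds on can be sketched with the standard multiplicative updates (Lee–Seung) for the Frobenius objective; the function below is an illustrative sketch, not the cited method.

```python
import numpy as np

def nmf(V, r, iters=200, seed=0, eps=1e-9):
    """Plain NMF via multiplicative updates, minimizing ||V - W H||_F^2
    while keeping the factors W and H elementwise nonnegative."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r)) + eps
    H = rng.random((r, m)) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update coefficients
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update basis parts
    return W, H

V = np.random.default_rng(1).random((20, 30))  # nonnegative data matrix
W, H = nmf(V, r=5)
err = np.linalg.norm(V - W @ H)
```

Discriminant variants add LDA-style penalty terms to this objective so that the learned parts also separate the classes.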
An Improved Odor Recognition System Using Learning Vector Quantization with a New Discriminant Analysis
A high-performance biologically inspired odor identification system is described. The learning vector quantization (LVQ) algorithm is employed for odor recognition, and performance is improved by preprocessing the input samples with discriminant analysis. Because decisions are made per sample, the system can operate reliably as a real-time electronic nose.
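The excerpt does not detail the LVQ variant used; as context, the classic LVQ1 rule can be sketched in a few lines: the prototype nearest to a training sample is pulled toward it when their labels agree and pushed away otherwise. Names and settings below are illustrative.

```python
import numpy as np

def lvq1_fit(X, y, lr=0.05, epochs=30, seed=0):
    """LVQ1 with one prototype per class, initialized at the class mean.
    The winning prototype moves toward correctly classified samples and
    away from misclassified ones."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    protos = np.vstack([X[y == c].mean(axis=0) for c in classes])
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            k = np.argmin(((protos - X[i]) ** 2).sum(axis=1))  # winner
            step = lr * (X[i] - protos[k])
            protos[k] += step if classes[k] == y[i] else -step
    return protos, classes

def lvq1_predict(protos, classes, X):
    d = ((X[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return classes[np.argmin(d, axis=1)]

rng = np.random.default_rng(2)
X = np.vstack([rng.normal([0, 0], 0.5, (100, 2)),
               rng.normal([2, 2], 0.5, (100, 2))])
y = np.repeat([0, 1], 100)
protos, classes = lvq1_fit(X, y)
acc = (lvq1_predict(protos, classes, X) == y).mean()
```

Prediction is a nearest-prototype lookup, which is what makes sample-by-sample real-time operation cheap.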
متن کاملKernel Alignment Inspired Linear Discriminant Analysis
Kernel alignment measures the degree of similarity between two kernels. In this paper, inspired from kernel alignment, we propose a new Linear Discriminant Analysis (LDA) formulation, kernel alignment LDA (kaLDA). We first define two kernels, data kernel and class indicator kernel. The problem is to find a subspace to maximize the alignment between subspace-transformed data kernel and class ind...
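The kaLDA objective itself is truncated in the excerpt, but the kernel alignment measure it starts from has a standard form: the normalized Frobenius inner product of two kernel matrices. The sketch below, with illustrative data, computes the alignment between a linear data kernel and a class indicator kernel and checks that it drops when labels are shuffled; it is not the kaLDA algorithm.

```python
import numpy as np

def alignment(K1, K2):
    """Empirical kernel alignment: <K1, K2>_F / (||K1||_F * ||K2||_F)."""
    return (K1 * K2).sum() / (np.linalg.norm(K1) * np.linalg.norm(K2))

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-2, 1, (50, 4)),
               rng.normal(2, 1, (50, 4))])
y = np.repeat([0, 1], 50)

Xc = X - X.mean(axis=0)   # center so the linear kernel reflects class structure
K_data = Xc @ Xc.T        # data kernel
Y = np.eye(2)[y]
K_class = Y @ Y.T         # class indicator kernel: 1 iff same label

a_true = alignment(K_data, K_class)
Yp = np.eye(2)[rng.permutation(y)]
a_shuf = alignment(K_data, Yp @ Yp.T)  # alignment under shuffled labels
```

kaLDA then searches for a subspace whose transformed data kernel maximizes this alignment with the class indicator kernel.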
Fisher’s Linear Discriminant Analysis for Weather Data by reproducing kernel Hilbert spaces framework
With recent advances in science and technology, data of a functional nature are increasingly easy to collect, so their statistical analysis is of great importance. As in multivariate analysis, linear combinations of random variables play a key role in functional analysis, and the theory of Reproducing Kernel Hilbert Spaces is central in this context. In this paper we study a gen...
Choice of B-splines with free parameters in the flexible discriminant analysis context
Flexible discriminant analysis (FDA) is a general methodology that provides tools for multigroup nonlinear classification. It is a nonparametric version of discriminant analysis, obtained by replacing linear regression with an arbitrary nonparametric regression method. A new option for FDA, a nonparametric regression method based on B-spline functions, is introduced. The relev...